
Simultaneous and Meshfree Topology Optimization with Physics-informed Gaussian Processes
Topology optimization (TO) provides a principled mathematical approach for optimizing structural performance. The majority of existing TO approaches leverage numerical solvers for design evaluations, which means that they (1) have a nested structure and (2) rely on discretizing the design and state variables. In contrast to these approaches, herein we develop a new class of TO methods based on the framework of Gaussian processes (GPs) whose mean functions are parameterized via deep neural networks. Specifically, we place GP priors on all design and state variables to represent them via parameterized continuous functions. These GPs share a deep neural network as their mean function but have as many independent kernels as there are state and design variables. We estimate all the parameters of our model in a single loop that optimizes a penalized version of the performance metric, where the penalty terms correspond to the state equations and design constraints. Attractive features of our approach include (1) a built-in continuation nature and (2) discretization invariance, which accommodates complex domains and topologies. To test our method against commercial software, we evaluate it on four problems involving the minimization of dissipated power in Stokes flow.
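To make the single-loop idea concrete, the sketch below is a toy illustration (my own construction, not the authors' GP model): a scalar design variable and a continuously parameterized state field are optimized *together* by penalizing the state-equation residual at collocation points, rather than nesting a numerical solver inside the design loop. All names, the 1D model problem, and the penalty weight are illustrative assumptions.

```python
import numpy as np

# Toy "simultaneous" (single-loop, meshfree) optimization, in the spirit of
# the abstract but much simpler: no GP/NN, just a polynomial state field.
#
# State equation:  a * u''(x) = -1  on (0, 1),  u(0) = u(1) = 0,
# with scalar design variable a > 0 (a "stiffness").
# Objective: compliance ~ mean(u) plus a material cost c * a; the exact
# trade-off optimum is a = 1 / sqrt(12 c).

xs = np.linspace(0.0, 1.0, 64)   # collocation points (no mesh, just samples)
c_cost = 0.05                    # cost per unit of design variable
penalty = 10.0                   # weight of the state-equation penalty

def fields(params):
    """State field u and u'' from u(x) = x(1-x)(b0 + b1 x + b2 x^2)."""
    b0, b1, b2, log_a = params
    p = b0 + b1 * xs + b2 * xs**2            # polynomial factor
    dp = b1 + 2.0 * b2 * xs                  # p'
    ddp = 2.0 * b2                           # p''
    w = xs * (1.0 - xs)                      # enforces u(0) = u(1) = 0
    u = w * p
    upp = -2.0 * p + 2.0 * (1.0 - 2.0 * xs) * dp + w * ddp
    return u, upp, np.exp(log_a)

def loss(params):
    """Penalized objective: compliance + material cost + state residual."""
    u, upp, a = fields(params)
    residual = a * upp + 1.0                 # state equation a u'' = -1
    return np.mean(u) + c_cost * a + penalty * np.mean(residual**2)

def grad_fd(f, p, eps=1e-6):
    """Central finite-difference gradient (keeps the sketch dependency-free)."""
    g = np.zeros_like(p)
    for i in range(p.size):
        d = np.zeros_like(p); d[i] = eps
        g[i] = (f(p + d) - f(p - d)) / (2.0 * eps)
    return g

# Single loop: state coefficients and design variable are updated together;
# no inner solve of the state equation ever happens.
params = np.zeros(4)                         # log a = 0  =>  a = 1 initially
for _ in range(12000):
    params -= 0.003 * grad_fd(loss, params)

u, upp, a_opt = fields(params)
res_mse = np.mean((a_opt * upp + 1.0) ** 2)  # how well the PDE is satisfied
```

After the loop, `a_opt` sits near the analytic trade-off optimum (about 1.29 for `c_cost = 0.05`) and the residual MSE is small, even though the state equation was never solved directly. In the paper's method, the roles of the polynomial coefficients here are played by GP/deep-network parameters shared across all design and state fields, which is what gives the approach its discretization invariance.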